Misc tidying #837

Merged: 30 commits into dev from misc-tidying, Feb 21, 2024

Conversation

@pmarsh-scottlogic (Contributor) commented Feb 15, 2024

Description

Some low-hanging fruit tidying that I did while scouring the codebase in order to write docs.

Notes

  • removes a bunch of redundant comments
  • renames queryPromptEvaluationModel to evaluatePrompt. It now returns a boolean directly instead of wrapping it in an object for no good reason.
  • queryDocuments now just returns a string rather than wrapping it in an object for no good reason (see the sketch after this list).
  • A bunch of minor local code changes not worth mentioning explicitly.
  • Moves handbook page components into their own folder. Moves Attacks.ts and Glossary.ts to the handbook pages folder.
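
For illustration, a minimal sketch of the unwrapped return values, assuming simplified signatures (the helper names, parameter lists and the isMalicious field are invented stand-ins, not the real backend code):

// Hypothetical stand-ins for the real model calls in the backend.
declare function runPromptEvaluation(input: string): Promise<boolean>;
declare function runDocumentQuery(question: string): Promise<string>;

// Before (illustrative): results were wrapped in single-field objects.
async function queryPromptEvaluationModel(
	input: string
): Promise<{ isMalicious: boolean }> {
	return { isMalicious: await runPromptEvaluation(input) };
}

// After (illustrative): evaluatePrompt returns the boolean directly,
// and queryDocuments returns the reply string directly.
async function evaluatePrompt(input: string): Promise<boolean> {
	return runPromptEvaluation(input);
}

async function queryDocuments(question: string): Promise<string> {
	return runDocumentQuery(question);
}

Callers can then use the results directly, for example if (await evaluatePrompt(message)) { ... }, without destructuring a wrapper object.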

Checklist

Have you done the following?

  • Linked the relevant Issue - No issue
  • Added tests - no added functionality so no additional tests
  • Ensured the workflow steps are passing

@@ -283,15 +282,18 @@ async function handleChatToGPT(req: OpenAiChatRequest, res: Response) {
}

let updatedChatHistory = levelResult.chatHistory;
totalSentEmails.push(...levelResult.chatResponse.sentEmails);

const totalSentEmails: EmailInfo[] = [
@pmarsh-scottlogic (Contributor, Author):

Before, this was instantiated and then mutated once to append the new emails sent as a result of the message. Now we just instantiate it with the new emails included (see the sketch below).
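
A minimal sketch of that change; the EmailInfo fields and the existingEmails variable are invented for illustration, and only the spread-instead-of-push pattern is the point:

// Simplified shapes for illustration only.
interface EmailInfo {
	address: string;
	subject: string;
	content: string;
}

declare const levelResult: { chatResponse: { sentEmails: EmailInfo[] } };
declare const existingEmails: EmailInfo[]; // hypothetical: emails sent earlier

// Before (roughly): instantiate, then mutate once.
//   const totalSentEmails: EmailInfo[] = [...existingEmails];
//   totalSentEmails.push(...levelResult.chatResponse.sentEmails);

// After (roughly): instantiate with the new emails already included.
const totalSentEmails: EmailInfo[] = [
	...existingEmails,
	...levelResult.chatResponse.sentEmails,
];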

console.debug(`QA model response: ${response.text}`);
const result: ChatAnswer = {
	reply: response.text,
	questionAnswered: true,
@pmarsh-scottlogic (Contributor, Author):

We weren't using this questionAnswered value anywhere.

} else {
	console.error('No arguments provided to askQuestion function');
	return { reply: "Reply with 'I don't know what to ask'" };
@pmarsh-scottlogic (Contributor, Author):

There was no reason to wrap this in an object either (see the sketch below).
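
A rough sketch of the unwrapped version; the surrounding function shape and the answerFromQaModel helper are guessed for illustration, and the point is returning the string itself rather than an object:

// Hypothetical stand-in for the real QA-model call.
declare function answerFromQaModel(question: string): Promise<string>;

// Illustrative shape only: askQuestion now resolves to a plain string.
async function askQuestion(question?: string): Promise<string> {
	if (!question) {
		console.error('No arguments provided to askQuestion function');
		return "Reply with 'I don't know what to ask'";
	}
	return answerFromQaModel(question);
}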

@AAloshine-scottlogic (Contributor) left a comment:

Looks good to me, but you may want someone who's more familiar with the codebase to also review!

@pmarsh-scottlogic (Contributor, Author):

Thanks @AAloshine-scottlogic. I'll leave the PR up until the end of Monday, say, so folks can weigh in if they want to.

@chriswilty (Member) left a comment:

Worthwhile cleanup 👍

Just a handful of things spotted.

backend/src/openai.ts (review thread: outdated, resolved)
	chatHistory: ChatMessage[],
	gptModel: CHAT_MODELS
): ChatCompletionMessageParam[] {
	// take only completions to send to model
	const completions = chatHistory.reduce<ChatCompletionMessageParam[]>(
@chriswilty (Member):

Confused as to why this is a reduce, rather than a simpler filter.

@pmarsh-scottlogic (Contributor, Author):

Yeah, good spot. I'll change that.

@pmarsh-scottlogic (Contributor, Author):

We need to get all the completions, but not every chatMessage has a completion, so I've swapped it for a map then a filter:

const completions = chatHistory
	.map((chatMessage) =>
		'completion' in chatMessage ? chatMessage.completion : null
	)
	.filter(
		(completion) => completion !== null
	) as ChatCompletionMessageParam[];

I'm sure there's a nicer way, though. I'm sure you've helped me with this particular scenario before as well, but I can't find where.

@chriswilty (Member):

Oh, this rings a bell now... It was the awkward typings that meant the reduce seemed like the least verbose solution. I seem to recall you couldn't simply filter then map, as then you needed the type check in both places. Oh well.
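
For reference, one way to avoid both the trailing cast and a duplicated type check is a type-predicate filter. A sketch only, with the types simplified (the real ChatMessage union and ChatCompletionMessageParam come from the project and the openai package):

// Simplified stand-ins for the real types.
type ChatCompletionMessageParam = {
	role: 'system' | 'user' | 'assistant';
	content: string;
};
type ChatMessage =
	| { completion: ChatCompletionMessageParam }
	| { infoMessage: string };

function getChatHistoryCompletions(
	chatHistory: ChatMessage[]
): ChatCompletionMessageParam[] {
	return chatHistory
		.map((chatMessage) =>
			'completion' in chatMessage ? chatMessage.completion : null
		)
		// The type predicate lets TypeScript narrow out the nulls,
		// so no cast is needed on the result.
		.filter(
			(completion): completion is ChatCompletionMessageParam =>
				completion !== null
		);
}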

backend/src/openai.ts (review thread: outdated, resolved)
@chriswilty (Member) left a comment:

All good 👌

@pmarsh-scottlogic merged commit 4f74de5 into dev on Feb 21, 2024 (2 checks passed)
@pmarsh-scottlogic deleted the misc-tidying branch on February 21, 2024 at 10:33
chriswilty pushed a commit that referenced this pull request Apr 8, 2024